    Bayesian data assimilation in shape registration

    In this paper we apply a Bayesian framework to the problem of geodesic curve matching. Given a template curve, the geodesic equations provide a mapping from initial conditions for the conjugate momentum onto topologically equivalent shapes. Here, we aim to recover the well-defined posterior distribution on the initial momentum which gives rise to observed points on the target curve; this is achieved by explicitly including a reparameterisation in the formulation. Appropriate priors are chosen for the functions which together determine this field and the positions of the observation points, the initial momentum p0 and the reparameterisation vector field v, informed by regularity results about the forward model. Having done this, we illustrate how Maximum Likelihood Estimators (MLEs) can be used to find regions of high posterior density, but also how we can apply recently developed MCMC methods on function spaces to characterise the whole of the posterior density. These illustrative examples also include scenarios where the posterior distribution is multimodal and irregular, leading us to the conclusion that knowledge of a state of global maximal posterior density does not always give us the whole picture, and full posterior sampling can give better quantification of likely states and the overall uncertainty inherent in the problem.
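
    As a rough guide to the structure described above (a sketch with illustrative notation, not taken from the paper): writing G for the forward map taking the initial momentum p0 and the reparameterisation field v to the predicted observation points, and assuming Gaussian observational noise with covariance Sigma, the posterior being characterised has the form

        \pi(p_0, v \mid y) \;\propto\; \exp\!\Big(-\tfrac{1}{2}\,\bigl\|\,y - \mathcal{G}(p_0, v)\,\bigr\|_{\Sigma}^{2}\Big)\,\pi_0(p_0)\,\pi_0(v),

    with \pi_0 denoting the priors chosen for p0 and v; the function-space MCMC methods mentioned sample this distribution directly.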

    Measuring Securities Market Efficiency in the Regulatory Setting

    In November 1998, the SEC proposed a modification to the federal securities law disclosure requirements to facilitate the process of issuing new securities. Thomas and Cotter discuss how to determine when companies should be able to issue simplified disclosure documents.

    Solving the Poisson equation on small aspect ratio domains using unstructured meshes

    We discuss the ill-conditioning of the matrix for the discretised Poisson equation in the small aspect ratio limit, and motivate this problem in the context of nonhydrostatic ocean modelling. Efficient iterative solvers for the Poisson equation in small aspect ratio domains are crucial for the successful development of nonhydrostatic ocean models on unstructured meshes. We introduce a new multigrid preconditioner for the Poisson problem which can be used with finite element discretisations on general unstructured meshes; this preconditioner is motivated by the fact that the Poisson problem has a condition number which is independent of aspect ratio when Dirichlet boundary conditions are imposed on the top surface of the domain. This leads to the first level in an algebraic multigrid solver (which can be extended by further conventional algebraic multigrid stages), and an additive smoother. We illustrate the method with numerical tests on unstructured meshes, which show that the preconditioner makes a dramatic improvement over a more standard multigrid preconditioner approach, and also show that the additive smoother produces better results than standard SOR smoothing. This new solver method makes it feasible to run nonhydrostatic unstructured mesh ocean models in small aspect ratio domains. (Comment: submitted to Ocean Modelling.)
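
    For orientation, a schematic rescaling argument (illustrative, not quoted from the paper): on a domain of horizontal extent L and depth H with aspect ratio epsilon = H/L much less than 1, mapping both coordinates to the unit square turns the Poisson operator into a strongly anisotropic one,

        -\partial_{xx}u - \partial_{zz}u = f \quad\longrightarrow\quad -\varepsilon^{2}\,\partial_{\hat{x}\hat{x}}\hat{u} - \partial_{\hat{z}\hat{z}}\hat{u} = \hat{f}, \qquad \varepsilon = H/L,

    so the vertical coupling dominates as epsilon tends to zero. This is the source of the aspect-ratio-dependent conditioning, and it is why the problem with Dirichlet conditions on the top surface, whose condition number is independent of aspect ratio, is a natural first level for the multigrid hierarchy.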

    Approximation of Bayesian inverse problems for PDEs

    Inverse problems are often ill-posed, with solutions that depend sensitively on data. In any numerical approach to the solution of such problems, regularization of some form is needed to counteract the resulting instability. This paper is based on an approach to regularization, employing a Bayesian formulation of the problem, which leads to a notion of well-posedness for inverse problems, at the level of probability measures. The stability which results from this well-posedness may be used as the basis for quantifying the approximation, in finite dimensional spaces, of inverse problems for functions. This paper contains a theory which utilizes this stability property to estimate the distance between the true and approximate posterior distributions, in the Hellinger metric, in terms of error estimates for approximation of the underlying forward problem. This is potentially useful as it allows for the transfer of estimates from the numerical analysis of forward problems into estimates for the solution of the related inverse problem. It is noteworthy that, when the prior is a Gaussian random field model, controlling differences in the Hellinger metric leads to control on the differences between expected values of polynomially bounded functions and operators, including the mean and covariance operator. The ideas are applied to some non-Gaussian inverse problems where the goal is determination of the initial condition for the Stokes or Navier–Stokes equation from Lagrangian and Eulerian observations, respectively.
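
    Schematically (constants and hypotheses suppressed; notation is illustrative rather than the paper's): if mu is the true posterior and mu^N the posterior obtained after replacing the forward map with a numerical approximation whose error is bounded by psi(N), the transfer estimates take the form

        d_{\mathrm{Hell}}\bigl(\mu, \mu^{N}\bigr) \;\le\; C\,\psi(N),
        \qquad
        \bigl|\mathbb{E}^{\mu} f - \mathbb{E}^{\mu^{N}} f\bigr| \;\le\; C_{f}\, d_{\mathrm{Hell}}\bigl(\mu, \mu^{N}\bigr)

    for functions f of polynomial growth (with C_f depending on second moments of f under both measures), so a convergence rate for the forward solver translates directly into a rate for posterior quantities such as the mean and covariance.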

    A Constrained Approach to Multiscale Stochastic Simulation of Chemically Reacting Systems

    Stochastic simulation of coupled chemical reactions is often computationally intensive, especially if a chemical system contains reactions occurring on different time scales. In this paper we introduce a multiscale methodology suitable to address this problem. It is based on the Conditional Stochastic Simulation Algorithm (CSSA), which samples from the conditional distribution of the suitably defined fast variables, given values for the slow variables. In the Constrained Multiscale Algorithm (CMA) a single realization of the CSSA is then used for each value of the slow variable to approximate the effective drift and diffusion terms, in a similar manner to the constrained mean-force computations in other applications such as molecular dynamics. We then show how, using the ensuing Stochastic Differential Equation (SDE) approximation, we can in turn approximate average switching times in stochastic chemical systems.
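
    As a sketch of the quantities involved (notation illustrative, not taken from the paper): with slow variable S, fast variables X, propensities a_j and net change nu_j of S due to reaction j, the effective SDE has drift and diffusion built from conditional averages,

        dS = V(S)\,dt + \sqrt{D(S)}\,dW, \qquad
        V(S) \approx \sum_{j} \nu_{j}\,\mathbb{E}\bigl[a_{j}(X)\mid S\bigr], \qquad
        D(S) \approx \sum_{j} \nu_{j}^{2}\,\mathbb{E}\bigl[a_{j}(X)\mid S\bigr],

    where in the CMA the conditional expectations at each fixed S are approximated by time averages along a single constrained (CSSA) realisation of the fast dynamics.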

    Conversations in the zone: collaborative learning in the counselor/student relationship

    This study was an action research project using collaborative learning to inquire into my practice as a counselor working with nine first-generation college students in a federal TRIO Student Support Services (SSS) program at a land-grant university. The study followed the description of the history and parameters of my practice, my assumptions and reasons for interest in the initiative, a practical theory for addressing issues, and the reasons I believe collaborative learning reconciles practical and formal theories. My goal in this work was to move beyond an information-gathering role with students to a dialogical relationship in which we jointly construct knowledge. To this end, I initiated a phenomenological interview as part of the intake process for students applying to the SSS program and then followed this with a dialogue with students. Three students participated in the summer semester of 2000, and six in the fall semester of 2000. A change in procedure from the summer to fall semesters enhanced the sought-after conversational qualities I define as in the zone. I found elements of our dialogue that help to define this type of conversation: speech that carries its own momentum, playing with concepts, and use of images and metaphor. Experiential knowledge was also co-constructed within the zone. The study revealed phenomenological interviews to be an enlightening experience for students and myself. It also demonstrated that in-depth and image-rich conversations can help develop responsive relationships while preserving our respective roles. Results indicated that participants interpreted experiences through meaning perspectives and that the criteria for dialogue and expectations of participants had an effect on the quality of our conversations. An analysis of the action research project indicated that it does meet the criteria set out by Hilary Bradbury and Peter Reason in the Handbook of Action Research. Beyond providing new knowledge and meeting quality standards, the study contributed to my practice by helping me to transcend a fear of engagement and thereby to be open to the experiences of others.

    Variational data assimilation using targetted random walks

    The variational approach to data assimilation is a widely used methodology for both online prediction and for reanalysis (offline hindcasting). In either of these scenarios it can be important to assess uncertainties in the assimilated state. Ideally it would be desirable to have complete information concerning the Bayesian posterior distribution for the unknown state, given data. The purpose of this paper is to show that complete computational probing of this posterior distribution is now within reach in the offline situation. In this paper we will introduce an MCMC method which enables us to directly sample from the Bayesian posterior distribution on the unknown functions of interest, given observations. Since we are aware that these methods are currently too computationally expensive to consider using in an online filtering scenario, we frame this in the context of offline reanalysis. Using a simple random walk-type MCMC method, we are able to characterize the posterior distribution using only evaluations of the forward model of the problem, and of the model and data mismatch. No adjoint model is required for the method we use; however, more sophisticated MCMC methods are available which do exploit derivative information. For simplicity of exposition we consider the problem of assimilating data, either Eulerian or Lagrangian, into a low Reynolds number (Stokes flow) scenario in a two-dimensional periodic geometry. We will show that in many cases it is possible to recover the initial condition and model error (which we describe as unknown forcing to the model) from data, and that with increasing amounts of informative data, the uncertainty in our estimates decreases.
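
    A minimal sketch (illustrative only; the negative log-posterior, prior, and tuning are placeholders rather than the paper's setup) of the kind of random walk-type Metropolis sampler described above, which needs only evaluations built from forward-model runs and no adjoint model:

        import numpy as np

        def rw_metropolis(neg_log_post, u0, beta=0.05, n_steps=10_000, rng=None):
            """Random-walk Metropolis targeting pi(u) ~ exp(-neg_log_post(u)).

            neg_log_post : callable returning the negative log posterior (data
                           misfit from one forward-model solve plus the prior
                           term); no derivative / adjoint information is used.
            u0           : starting state, e.g. a discretised initial condition.
            beta         : proposal step size, tuned for a sensible acceptance rate.
            """
            rng = np.random.default_rng() if rng is None else rng
            u = np.asarray(u0, dtype=float).copy()
            lp_u = neg_log_post(u)
            samples, accepted = [], 0
            for _ in range(n_steps):
                v = u + beta * rng.standard_normal(u.shape)   # symmetric random-walk proposal
                lp_v = neg_log_post(v)
                if np.log(rng.uniform()) < lp_u - lp_v:       # Metropolis accept/reject
                    u, lp_u = v, lp_v
                    accepted += 1
                samples.append(u.copy())
            return np.array(samples), accepted / n_steps

        # Toy usage: sample a standard Gaussian in two dimensions.
        # samples, rate = rw_metropolis(lambda u: 0.5 * float(u @ u), np.zeros(2))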

    MCMC methods for functions: modifying old algorithms to make them faster

    Many problems arising in applications result in the need to probe a probability distribution for functions. Examples include Bayesian nonparametric statistics and conditioned diffusion processes. Standard MCMC algorithms typically become arbitrarily slow under the mesh refinement dictated by nonparametric description of the unknown function. We describe an approach to modifying a whole range of MCMC methods which ensures that their speed of convergence is robust under mesh refinement. In the applications of interest the data is often sparse and the prior specification is an essential part of the overall modeling strategy. The algorithmic approach that we describe is applicable whenever the desired probability measure has density with respect to a Gaussian process or Gaussian random field prior, and to some useful non-Gaussian priors constructed through random truncation. Applications are shown in density estimation, data assimilation in fluid mechanics, subsurface geophysics and image registration. The key design principle is to formulate the MCMC method for functions. This leads to algorithms which can be implemented via minor modification of existing algorithms, yet which show enormous speed-up on a wide range of applied problems.
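
    One well-known example of such a modification (a sketch, assuming a centred Gaussian prior N(0, C) and a data-misfit functional Phi; names and defaults here are illustrative) is the preconditioned Crank-Nicolson proposal, in which the prior enters the proposal rather than the acceptance ratio, so the acceptance rate does not collapse as the mesh for the unknown function is refined:

        import numpy as np

        def pcn_mcmc(phi, prior_sample, u0, beta=0.2, n_steps=10_000, rng=None):
            """Preconditioned Crank-Nicolson MCMC for mu(du) ~ exp(-phi(u)) mu0(du),
            where mu0 = N(0, C) is a Gaussian prior on the discretised function u.

            phi          : data-misfit / negative log-likelihood functional.
            prior_sample : callable returning a fresh draw xi ~ N(0, C) from the prior.
            beta         : step parameter in (0, 1].
            """
            rng = np.random.default_rng() if rng is None else rng
            u = np.asarray(u0, dtype=float).copy()
            phi_u = phi(u)
            samples = []
            for _ in range(n_steps):
                # pCN proposal: shrink towards the prior mean, then add prior noise.
                v = np.sqrt(1.0 - beta**2) * u + beta * prior_sample()
                phi_v = phi(v)
                # Only the misfit appears in the acceptance probability, because
                # the proposal is reversible with respect to the Gaussian prior.
                if np.log(rng.uniform()) < phi_u - phi_v:
                    u, phi_u = v, phi_v
                samples.append(u.copy())
            return np.array(samples)

        # Toy usage with an i.i.d. Gaussian "prior" on a fine grid:
        # rng = np.random.default_rng(0)
        # out = pcn_mcmc(lambda u: 0.5 * float(np.sum(u**2)),
        #                lambda: rng.standard_normal(1000),
        #                np.zeros(1000), beta=0.3, n_steps=2000)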